Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
Authors
Abstract
This paper analyzes block-coordinate proximal gradient methods for minimizing the sum of a separable smooth function and a (nonseparable) nonsmooth function, both of which are allowed to be nonconvex. The main tool in our analysis is the forward-backward envelope, which serves as a particularly suitable continuous, real-valued Lyapunov function. Global and linear convergence results are established when the cost satisfies the Kurdyka–Łojasiewicz property, without imposing convexity requirements on the smooth function. Two prominent special cases of the investigated setting are regularized finite-sum minimization and the sharing problem; in particular, an immediate byproduct of our analysis leads to novel convergence rates for the popular Finito/MISO algorithm for nonconvex finite-sum minimization with very general sampling strategies.
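To make the setting concrete, here is a minimal, hypothetical sketch of the plain forward-backward (proximal gradient) iteration for min f(x) + g(x), with f smooth and g the L1 norm, whose proximal mapping is soft-thresholding. The paper's block-coordinate scheme and forward-backward-envelope analysis are not reproduced here; the problem instance, step size, and function names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_step(x, grad_f, gamma, lam):
    """One iteration: forward (gradient) step on f, backward (prox) step on g."""
    return soft_threshold(x - gamma * grad_f(x), gamma * lam)

# Illustrative instance: f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)

x = np.zeros(5)
for _ in range(500):
    x = forward_backward_step(x, grad_f, 1.0 / L, lam)
```

With step size 1/L the composite objective is monotonically nonincreasing along the iterates, which is the basic mechanism the forward-backward envelope refines into a Lyapunov function for the block-coordinate case.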
Similar resources
A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization
We analyze stochastic gradient algorithms for optimizing nonconvex, nonsmooth finite-sum problems. In particular, the objective function is given by the summation of a differentiable (possibly nonconvex) component, together with a possibly non-differentiable but convex component. We propose a proximal stochastic gradient algorithm based on variance reduction, called ProxSVRG+. The algorithm is ...
Full text
Proximal alternating linearized minimization for nonconvex and nonsmooth problems
We introduce a proximal alternating linearized minimization (PALM) algorithm for solving a broad class of nonconvex and nonsmooth minimization problems. Building on the powerful Kurdyka–Łojasiewicz property, we derive a self-contained convergence analysis framework and establish that each bounded sequence generated by PALM globally converges to a critical point. Our approach allows us to analyze va...
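A PALM-style iteration alternates proximal linearized steps over the two blocks, each with a step size set by that block's partial Lipschitz constant. The following is a hypothetical sketch on a rank-1 nonnegative factorization, min over x, y >= 0 of 0.5*||A - x y^T||_F^2, where the prox of each nonnegativity constraint is projection onto the nonnegative orthant; the coupling term, data, and step-size choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Rank-1 nonnegative factorization in the spirit of PALM: alternate
# projected-gradient steps on x and y with block-Lipschitz step sizes.
rng = np.random.default_rng(1)
A = np.abs(rng.standard_normal((8, 6)))
x = np.ones(8)
y = np.ones(6)

for _ in range(200):
    Ly = max(y @ y, 1e-8)              # Lipschitz constant of grad wrt x (y fixed)
    x = np.maximum(x - (1.0 / Ly) * ((np.outer(x, y) - A) @ y), 0.0)
    Lx = max(x @ x, 1e-8)              # Lipschitz constant of grad wrt y (x fixed)
    y = np.maximum(y - (1.0 / Lx) * ((np.outer(x, y) - A).T @ x), 0.0)
```

Each block update is a prox-linearized step, so the objective decreases at every half-iteration; the Kurdyka–Łojasiewicz property is what upgrades this descent into convergence of the whole sequence.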
Full text
Inertial proximal alternating minimization for nonconvex and nonsmooth problems
In this paper, we study the minimization problem of the type [Formula: see text], where f and g are both nonconvex nonsmooth functions, and R is a smooth function we can choose. We present a proximal alternating minimization algorithm with inertial effect. We obtain the convergence by constructing a key function H that guarantees a sufficient decrease property of the iterates. In fact, we prove...
Full text
Accelerated Proximal Gradient Methods for Nonconvex Programming
Nonconvex and nonsmooth problems have recently received considerable attention in signal/image processing, statistics and machine learning. However, solving the nonconvex and nonsmooth optimization problems remains a big challenge. Accelerated proximal gradient (APG) is an excellent method for convex programming. However, it is still unknown whether the usual APG can ensure the convergence to a...
Full text
Journal
Journal title: Mathematical Programming
Year: 2021
ISSN: 0025-5610, 1436-4646
DOI: https://doi.org/10.1007/s10107-020-01599-7